Asymptotic Normality of Scaling Functions

Authors

  • Louis H. Y. Chen
  • Tim N. T. Goodman
  • S. L. Lee
Abstract

The Gaussian function $G(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$, which has been a classical choice for multiscale representation, is the solution of the scaling equation
$$G(x) = \int_{\mathbb{R}} \alpha\, G(\alpha x - y)\, dg(y), \qquad x \in \mathbb{R},$$
with scale $\alpha > 1$ and absolutely continuous measure
$$dg(y) = \frac{1}{\sqrt{2\pi(\alpha^2 - 1)}}\, e^{-y^2/(2(\alpha^2 - 1))}\, dy.$$
It is known that the sequence of normalized B-splines $(B_n)$, where $B_n$ is the solution of the scaling equation
$$\phi(x) = \sum_{j=0}^{n} \frac{1}{2^{n-1}} \binom{n}{j}\, \phi(2x - j), \qquad x \in \mathbb{R},$$
converges uniformly to $G$. The classical results on the normal approximation of binomial distributions and the uniform B-splines are studied in the broader context of normal approximation of probability measures $m_n$, $n = 1, 2, \ldots$, and the corresponding solutions $\phi_n$ of the scaling equations
$$\phi_n(x) = \int_{\mathbb{R}} \alpha\, \phi_n(\alpha x - y)\, dm_n(y), \qquad x \in \mathbb{R}.$$
Various forms of convergence are considered, and orders of convergence are obtained. A class of probability densities is constructed that converges to the Gaussian function faster than the uniform B-splines.
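The central statement of the abstract, that the standardized B-splines $B_n$ converge uniformly to the Gaussian $G$, can be checked numerically. The sketch below is not part of the paper: it builds $B_n$ as the $n$-fold convolution of the uniform density on $[0,1)$ on a discrete grid (the step size `h` and the convolution-based construction are choices of this illustration), standardizes it to zero mean and unit variance, and prints the sup-norm distance to $G$, which should shrink as $n$ grows.

```python
import numpy as np

def bspline_vs_gaussian(n, h=1e-3):
    """Sup-norm distance between the standardized order-n uniform B-spline
    and the standard Gaussian density, computed on a discrete grid."""
    grid = np.arange(0.0, 1.0, h)
    u = np.ones_like(grid)            # uniform density on [0, 1)
    b = u.copy()
    for _ in range(n - 1):            # B_n = n-fold convolution of the uniform density
        b = np.convolve(b, u) * h     # Riemann-sum approximation of the convolution
    x = np.arange(b.size) * h         # B_n is supported on [0, n]
    mean, std = n / 2.0, np.sqrt(n / 12.0)
    z = (x - mean) / std              # standardize to zero mean, unit variance
    bn_std = b * std                  # density of the standardized variable
    gauss = np.exp(-z ** 2 / 2) / np.sqrt(2 * np.pi)
    return np.max(np.abs(bn_std - gauss))

for n in (2, 4, 8, 16):
    print(n, bspline_vs_gaussian(n))
```

The printed distances decrease with $n$, illustrating the baseline B-spline convergence; the abstract's further claim is that other choices of scaling measures $m_n$ yield densities converging to the Gaussian faster than these B-splines.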


Similar resources

Some Asymptotic Results of Kernel Density Estimator in Length-Biased Sampling

In this paper, we prove the strong uniform consistency and asymptotic normality of the kernel density estimator proposed by Jones [12] for length-biased data. The approach is based on the invariance principle for empirical processes proved by Horváth [10]. Simulations are carried out for different cases to demonstrate both consistency and asymptotic normality, and the method is illustrated by ...
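For concreteness, here is a minimal sketch of a kernel density estimator of the harmonic-mean-weighted form commonly used for length-biased data. It is an illustration only and is not claimed to reproduce the exact estimator of Jones [12] analyzed in that paper; the Gaussian kernel and the bandwidth `h` are arbitrary choices here.

```python
import numpy as np

def length_biased_kde(x_grid, y, h):
    """Kernel density estimate of the underlying density f from length-biased
    observations y (sampled with probability proportional to their size),
    using the weighted form f_hat(x) = (mu_hat / n) * sum_i K_h(x - Y_i) / Y_i."""
    y = np.asarray(y, dtype=float)
    x_grid = np.asarray(x_grid, dtype=float)
    mu_hat = 1.0 / np.mean(1.0 / y)                 # estimate of the biasing constant E[Y]
    gauss = lambda t: np.exp(-t ** 2 / 2) / np.sqrt(2 * np.pi)
    diffs = (x_grid[:, None] - y[None, :]) / h      # scaled differences, shape (len(x_grid), n)
    return mu_hat / h * np.mean(gauss(diffs) / y[None, :], axis=1)
```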

Asymptotic Behaviors of Nearest Neighbor Kernel Density Estimator in Left-truncated Data

Kernel density estimators are the basic tools for density estimation in non-parametric statistics. The k-nearest neighbor kernel estimators represent a special form of kernel density estimators, in which the bandwidth is varied depending on the location of the sample points. In this paper, we initially introduce the k-nearest neighbor kernel density estimator in the random left-truncatio...
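As an illustration of the estimator class described above, the following is a minimal sketch of a k-nearest-neighbor kernel density estimator for complete i.i.d. data, in which the bandwidth at each evaluation point is the distance to the k-th nearest sample point; the left-truncation adjustment analyzed in that paper is not included.

```python
import numpy as np

def knn_kernel_density(x_grid, data, k):
    """k-nearest-neighbor kernel density estimate: at each point x the bandwidth
    is R_k(x), the distance from x to its k-th nearest sample point, so
    f_hat(x) = (1 / (n * R_k(x))) * sum_i K((x - X_i) / R_k(x))."""
    data = np.asarray(data, dtype=float)
    gauss = lambda t: np.exp(-t ** 2 / 2) / np.sqrt(2 * np.pi)
    estimates = []
    for x in np.asarray(x_grid, dtype=float):
        r_k = np.sort(np.abs(data - x))[k - 1]      # distance to the k-th nearest neighbor of x
        estimates.append(np.mean(gauss((x - data) / r_k)) / r_k)
    return np.array(estimates)
```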

Nonparametric regression with rescaled time series errors

We consider a heteroscedastic nonparametric regression model with an autoregressive error process of finite known order p. The heteroscedasticity is incorporated using a scaling function defined at uniformly spaced design points on an interval [0,1]. We provide an innovative nonparametric estimator of the variance function and establish its consistency and asymptotic normality. We also propose ...

Asymptotic normality of fringe subtrees and additive functionals in conditioned Galton–Watson trees. (Extended abstract)

We consider conditioned Galton–Watson trees and show asymptotic normality of additive functionals that are defined by toll functions that are not too large. This includes, as a special case, asymptotic normality of the number of fringe subtrees isomorphic to any given tree, and joint asymptotic normality for several such subtree counts. The offspring distribution defining the random tree is ass...

Asymptotic normality of fringe subtrees and additive functionals in conditioned Galton–Watson trees

We consider conditioned Galton–Watson trees and show asymptotic normality of additive functionals that are defined by toll functions that are not too large. This includes, as a special case, asymptotic normality of the number of fringe subtrees isomorphic to any given tree, and joint asymptotic normality for several such subtree counts. Another example is the number of protected nodes. The offs...
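The quantity whose asymptotic normality is stated above, the number of fringe subtrees isomorphic to a fixed tree, can be computed directly for any given rooted tree. The sketch below uses a nested-tuple representation of rooted unordered trees; the representation and the helper names are choices of this illustration, not notation from the paper.

```python
def canonical(tree):
    """Canonical form of a rooted unordered tree given as a tuple of child subtrees;
    two rooted trees are isomorphic exactly when their canonical forms are equal."""
    return tuple(sorted(canonical(child) for child in tree))

def count_fringe_copies(tree, pattern):
    """Count fringe subtrees of `tree` (a node together with all of its
    descendants) that are isomorphic to `pattern`."""
    target = canonical(pattern)
    count, stack = 0, [tree]
    while stack:
        node = stack.pop()
        if canonical(node) == target:
            count += 1
        stack.extend(node)            # push the children of the current node
    return count

# Example: a root with a leaf child, a two-node-path child, and a cherry child.
example = ((), ((),), ((), ()))
print(count_fringe_copies(example, ((),)))   # fringe copies of the two-node path -> 1
```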

Semiparametric regression estimation using noisy nonlinear non invertible functions of the observations

We investigate a semiparametric regression model in which one observes noisy nonlinear non-invertible functions of the observations. We focus on the application to bearings-only tracking. We first investigate the least squares estimator and prove its consistency and asymptotic normality under mild assumptions. We study the semiparametric likelihood process and prove local asymptotic normality of the ...


Journal title:
  • SIAM J. Math. Analysis

Volume 36  Issue 

Pages  -

Publication date: 2004